1. The Nikkei Weekly, November 15, 1993, SCIENCE & TECHNOLOGY; Page 13, 561
words, For Cyberfingers, it's all in the wrist action Neural Chip Removes Need
For Data Glove
2. The Nikkei Weekly, April 12, 1993, MANAGEMENT & LABOR; Science Technology;
Pg. 10, 1022 words, Computer scientists try thought control; Myoelectric
potentials seen as stepping stones to 'brainy' machines, THE NIKKEI BUSINESS
The Nikkei Weekly
November 15, 1993
SECTION: SCIENCE & TECHNOLOGY; Page 13
LENGTH: 561 words
HEADLINE: For Cyberfingers, it's all in the wrist action
Neural Chip Removes Need For Data Glove
BODY:
As high-tech gadgetry goes, the data glove is hard to beat. When
you put the data glove on your hand, the means to control virtual
reality is literally at your fingertips.
But the data glove is bulky. Is there no easier way to get a handle on
cyberspace?
Yes, the Cyberfinger, say researchers at the Human Interface Laboratories
of Nippon Telegraph and Telephone Corp.
NTT's Cyberfinger is a control technology, designed to govern the
movements of a robot hand in master-slave fashion, as well as to allow
a user to interact with computers and virtual reality environments.
The NTT team also foresees applications in the field of prosthetics,
where the technology could be used to control an artificial hand that
responds to the wearer's intentions.
The Cyberfinger's control technology has two main components. One is a
two-electrode sensor which captures electrical signals generated by muscles in
the wrist. The other is a neural chip, a type of integrated circuit that can
learn to recognize patterns.
This neural chip is first taught how the signals coming from the wrist sensor
correspond to actual movements of the hand and fingers. Once it has learned, it
can then forward the signals on as commands to move a robot hand, or a virtual
hand in cyberspace.
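For readers who think in code, the signal path might be sketched as follows
in Python. The window length, the RMS feature and the linear stand-in for
the neural chip are illustrative assumptions, not details NTT has published.

    import numpy as np

    WINDOW = 200     # samples per analysis window (assumed)
    N_JOINTS = 10    # first and second joints of five fingers

    def emg_features(window):
        # Root-mean-square amplitude per electrode over one window.
        return np.sqrt(np.mean(window ** 2, axis=0))   # shape: (2,)

    class JointAngleModel:
        # Stand-in for the neural chip: a plain linear map from
        # two-electrode features to ten joint angles.
        def __init__(self, weights, bias):
            self.weights, self.bias = weights, bias

        def predict(self, feats):
            return feats @ self.weights + self.bias   # degrees, per joint

    rng = np.random.default_rng(0)
    model = JointAngleModel(rng.normal(size=(2, N_JOINTS)),
                            np.zeros(N_JOINTS))
    raw = rng.normal(size=(WINDOW, 2))      # one window of two-electrode EMG
    angles = model.predict(emg_features(raw))
    print(np.round(angles, 1))              # commands sent to the robot hand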
Myoelectric potential
The Cyberfinger takes advantage of the fact that the muscles which control
finger movement all converge at the wrist; the signals from the brain that
course down nerves to activate hand movements trigger muscles here. At the
wrist, as the muscles contract, a type of electric potential known as a
myoelectric potential is generated. The wrist sensor picks up the changes in
this myoelectric potential.
In order to teach the neural chip how these electric signals correspond to
actual hand and finger movements, a person wears the wrist sensor and a data
glove at the same time, and moves his hand and fingers around. The neural chip
compares the two sets of data - one coming from the sensor, and the other from
the data glove, which senses movements in the first and second joints of each
finger.
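In machine-learning terms, the glove supplies the training targets and the
wrist sensor supplies the inputs. A minimal sketch of that calibration step,
using an off-the-shelf regressor from scikit-learn purely as a stand-in for
the neural chip, with synthetic data in place of real recordings:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 2))            # wrist-sensor features per window
    y = rng.uniform(0, 90, size=(500, 10))   # glove-measured joint angles (deg)

    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(X, y)                            # learn the sensor-to-angle mapping

    # After calibration the glove comes off: wrist-sensor input alone
    # is enough to produce joint-angle commands.
    print(net.predict(X[:1]))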
In tests, NTT says the neural chip was able to learn the rules linking
myoelectric potential and finger movement in just two minutes.
After that, the data glove was removed, and the neural chip was able to direct
the movement of a robot hand based on signals from the wrist sensor alone.
NTT says the system can control complex finger movements, and not just simple
opening and closing of the fist. The robot fingers bend like the human fingers
bend, imitating the angle of bending with an error of less than 20 degrees on
average, the company claims.
That degree of error, while seemingly large, is actually a major
accomplishment. The changes in myoelectric potential which accompany wrist
movement differ from person to person, and even within the same person from time
to time. Developing a neural chip capable of sorting out the noise and
determining the underlying patterns was a technical feat in and of itself.
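Stated as an evaluation metric, the 20-degree figure is a mean absolute error
between the glove-measured angles and the angles the chip predicts. With
synthetic numbers standing in for real recordings:

    import numpy as np

    rng = np.random.default_rng(2)
    true_angles = rng.uniform(0, 90, size=(200, 10))               # glove truth
    pred_angles = true_angles + rng.normal(0, 15, size=(200, 10))  # model output

    mae = np.mean(np.abs(pred_angles - true_angles))
    print(f"mean angular error: {mae:.1f} degrees")  # article's figure: under 20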
"We were pleasantly surprised that such accurate movement could be achieved
with just two electrodes," admitted Akira Hiraiwa, director of the Human
Interface Laboratories. "Our next goal is to improve the system so it can
recognize the relative strength of finger movements, and not just the angle of
bending."
The Nikkei Weekly
April 12, 1993
SECTION: MANAGEMENT & LABOR; Science Technology; Pg. 10
LENGTH: 1022 words
HEADLINE: Computer scientists try thought control;
Myoelectric potentials seen as stepping stones to 'brainy' machines
BYLINE: THE NIKKEI BUSINESS
BODY:
Today's computers are controlled by input from a keyboard, a mouse, a pen.
Soon they will be equipped with voice-recognition capabilities. But some
engineers are looking to the day when machines can be controlled directly by
thought.
At the Human Interface Laboratories of Nippon Telegraph and Telephone Corp., a
researcher curls and stretches his fingers, and a robot hand faithfully mimics
his actions.
This "slave hand" is controlled by a sensor strapped to his wrist. The
sensor picks up the extremely small electric currents -- called myoelectric
potentials -- that occur in muscle tissue as the fingers move. This information
is processed to determine not only whether a finger has been curled or extended
but also with how much force. This is then used to control the movement of the
robot hand.
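One generic way to estimate force from a myoelectric signal (not necessarily
NTT's method) is to track the signal's envelope: the rectified, smoothed
amplitude of the raw trace grows with contraction strength. A small sketch:

    import numpy as np

    def emg_envelope(signal, fs=1000, win_ms=100):
        # Moving RMS of a raw EMG trace; a larger envelope indicates a
        # stronger contraction, which is how force can be estimated.
        n = int(fs * win_ms / 1000)
        padded = np.pad(signal ** 2, (n // 2, n - n // 2 - 1), mode="edge")
        return np.sqrt(np.convolve(padded, np.ones(n) / n, mode="valid"))

    rng = np.random.default_rng(3)
    weak = rng.normal(0, 0.2, 1000)    # synthetic light contraction
    strong = rng.normal(0, 1.0, 1000)  # synthetic forceful contraction
    print(emg_envelope(weak).mean(), emg_envelope(strong).mean())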
NTT's " cyberfinger" project is much more involved than you might think.
The strength of these myoelectric potentials differs not only among individuals
but also within any given individual from one moment to the next. Even if a person
consciously attempts to bend a finger with the same force as before, minute
differences in the myoelectric potentials are generated.
To compensate for this, data is processed by a neural network capable of
dealing with subtle variations in the myoelectric potentials and making accurate
inferences about the person's intention.
Ghost in the machine
"The research is a step toward the ultimate goal of transferring human
thoughts naturally to machines," Akira Hiraiwa, director of the laboratories,
explains.
Paralleling his work with the cyberfinger, Hiraiwa is investigating whether
thoughts can be used as input signals to a computer.
Human speech and hand movements register as changes in brain wave activity
slightly before the muscles actually move. If these "preparatory electric
potentials" can be detected, it should be possible to recognize an intention to
act.
In one set of experiments, an NTT research group attached electrodes to the
heads of test subjects and recorded brain wave patterns as the subjects
pronounced one of two sounds: "ahh" or "ooh."
The recordings showed that electric potentials begin to change about one
second before the sounds are articulated. By analyzing these preparatory
electric potentials, it was possible to predict which sound the subjects were
about to say.
This, too, was much more involved than you might think. The brain houses an
extremely large number of neurons, many acting simultaneously to process sights
and sounds, to coordinate movements and such. All of this neuronal activity is
manifest in brain waves, so it is extremely difficult to tease out only signals
of interest.
In order to determine which electric potentials in the brain were associated
with the test subjects' preparation to pronounce each of the two sounds, the NTT
researchers relied on computer processing using a neural network.
First they taught the neural network to recognize brain wave patterns known
to be associated with the pronunciation of "ahh" and "ooh." With practice, the
network eventually learned to recognize with 100% accuracy both sounds before
they were uttered, based solely on changes in preparatory electric potentials.
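In outline, the experiment is a two-class pattern-recognition problem: epochs
of pre-utterance brain waves in, a predicted sound out. A toy version, with
synthetic data and an assumed channel count and sampling rate:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(4)
    n_trials, n_channels, n_samples = 120, 8, 250   # 1 s of EEG at 250 Hz
    epochs = rng.normal(size=(n_trials, n_channels, n_samples))
    labels = rng.integers(0, 2, n_trials)           # 0 = "ahh", 1 = "ooh"

    X = epochs.reshape(n_trials, -1)                # flatten each epoch
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X[:100], labels[:100])                  # learn the two patterns
    print(clf.score(X[100:], labels[100:]))         # predict held-out trials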
Similar experiments were conducted with subjects manipulating a joystick. In
this case, the neural network eventually managed to predict with 60-80% accuracy
whether the subjects were going to move their hand to the left, right or center.
However, the whole process in both sets of experiments was extremely time
consuming. "Even using a supercomputer, it can take hours to teach the neural
network," Hiraiwa admits.
Another shortcoming with the NTT experiments is that they both concentrated
on detecting changes in brain wave patterns prior to a change in muscle
activity, be it a hand motion or a vocalization. These changes do not arise
unless the person is actually going to follow through with an action.
Why go through the trouble of designing a computer that responds a priori to
brain activity that, by its very nature, must be carried out physically anyway?
Why not go one step further and develop a system that can recognize thoughts not
tied to muscle action?
Tapping the inner voice
That is exactly what researchers at Fujitsu Laboratories have set out to
do. A group led by Norio Fujimaki is trying to tap into the inner voice humans
use, for example, when reading to themselves. No muscles move during this
silent speech, so to study it is to directly observe human thoughts.
In one group of experiments, subjects were asked to silently voice the sound
"ahh" whenever a light wasturned on in front of them. The subjects' brain wavep
atterns were recorded throughout the tests. Analysis of these recordings
revealed that negative electric potentials are generated around the periphery of
the frontal lobe just after the light turns on.
Building on this work, the Fujitsu researchers have now turned their
attention to the extremely weak magnetic fields emitted by the brain.
Using a highly sensitive detector known as a superconducting quantum
interference device, the group is trying to identify precisely which part of the
brain is involved in silent speech.
However, the work has barely begun, and it may be a long time before it bears
fruit, because the tests are extremely demanding on the subjects.
In the first series of experiments, which took 10 hours to complete, eight
subjects underwent 50 to 100 trials and the recordings were averaged together to
isolate signals related to silent speech from background noise. The only result
of all this effort was evidence that brain wave patterns are different when
subjects silently speak the "ahh" sound.
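That averaging step is standard evoked-response practice: trials aligned to
the stimulus are averaged so that random background activity cancels while the
stimulus-locked response remains. An illustration with synthetic numbers:

    import numpy as np

    rng = np.random.default_rng(5)
    n_trials, n_samples = 100, 500
    response = 0.5 * np.sin(np.linspace(0, 4 * np.pi, n_samples))  # buried signal
    trials = response + rng.normal(0, 2.0, size=(n_trials, n_samples))  # noisy EEG

    # Averaging stimulus-aligned trials cancels random background activity;
    # noise shrinks roughly as 1/sqrt(n_trials) while the response survives.
    average = trials.mean(axis=0)
    print(np.corrcoef(average, response)[0, 1])  # recovered vs. original waveform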
"It's still too early to say whether we are really extracting information
related to thought from the brain," group leader Shinya Hasuo admits. "But if
we can differentiate between a silent 'yes' and a silent 'no' after three years
of research, I'll consider the project a success."
GRAPHIC: Picture, The ultimate brain wave, Sources: Nikkei Business
Publications, NTT